Search results for "statistical [Methods]"

Showing 10 of 1,664 documents

Approximation of functions over manifolds : A Moving Least-Squares approach

2021

We present an algorithm for approximating a function defined over a $d$-dimensional manifold using only noisy function values at locations sampled from the manifold, themselves subject to noise. To produce the approximation we require no knowledge of the manifold other than its dimension $d$. We use the Manifold Moving Least-Squares approach of (Sober and Levin 2016) to reconstruct the atlas of charts, and the approximation is built on top of those charts. The resulting approximant is shown to be a function defined over a neighborhood of a manifold, approximating the originally sampled manifold. In other words, given a new point located near the manifold, the approximation can be evaluated…
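The local fitting step behind moving least-squares can be illustrated with a minimal 1-D sketch (not the paper's manifold construction, which first reconstructs charts): approximate $f$ at a query point by a weighted linear fit whose Gaussian weights decay with distance from the query. All data and the bandwidth `h` here are illustrative assumptions.

```python
import math

def mls_fit(xs, ys, x0, h=0.3):
    """Weighted local linear least squares (1-D moving least squares):
    minimize sum_i w_i * (a + b*(xs[i]-x0) - ys[i])**2 with Gaussian
    weights w_i = exp(-((xs[i]-x0)/h)**2); returns a, the fitted value at x0."""
    s0 = s1 = s2 = t0 = t1 = 0.0
    for x, y in zip(xs, ys):
        d = x - x0
        w = math.exp(-((d / h) ** 2))
        s0 += w; s1 += w * d; s2 += w * d * d
        t0 += w * y; t1 += w * y * d
    det = s0 * s2 - s1 * s1          # 2x2 normal-equations determinant
    return (t0 * s2 - t1 * s1) / det

# Noisy samples of f(x) = x^2 on [0, 2]; the alternating term plays the noise
xs = [0.1 * i for i in range(21)]
ys = [x * x + 0.01 * (-1) ** i for i, x in enumerate(xs)]
approx = mls_fit(xs, ys, 1.0)        # close to f(1) = 1, up to smoothing bias
```

Shrinking `h` reduces the smoothing bias but makes the fit more sensitive to noise; the manifold version applies the same trade-off on each reconstructed chart.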

Keywords: Computational Geometry (cs.CG); Machine Learning (cs.LG); Graphics (cs.GR); Machine Learning (stat.ML); closed manifold; dimension reduction; complex dimension; topology; volume form; manifold learning; applied mathematics; manifold alignment; atlas (topology); high-dimensional approximation; statistical manifold; regression over manifolds; out-of-sample extension; numerical analysis; manifolds; approximation; functions; moving least-squares; center manifold

A Wideband MIMO Channel Model Derived From the Geometric Elliptical Scattering Model

2006

In this paper, we present a reference model for a wideband multiple-input multiple-output (MIMO) channel based on the geometric elliptical scattering model. The model takes into account the exact relationship between the angle of departure (AOD) and the angle of arrival (AOA). Based on this relationship, the statistical properties of the reference model are studied. Analytical solutions are presented for the three-dimensional (3D) space-time cross-correlation function (CCF), the temporal autocorrelation function (ACF), the 2D space CCF, and finally the frequency correlation function (FCF). The correlation properties are studied and visualized under the assumption of isotropic as well as no…
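As a baseline for such correlation analyses, the temporal ACF of the classical isotropic-scattering (Clarke) reference model is $r(\tau) = J_0(2\pi f_{\max}\tau)$; the sketch below evaluates it from the $J_0$ power series. The elliptical model of the paper generalizes this to non-isotropic scattering with coupled AOD and AOA, so the formula here is only the isotropic special case, and the maximum Doppler frequency is an illustrative assumption.

```python
import math

def bessel_j0(x, terms=30):
    """Power series J0(x) = sum_k (-1)^k * (x/2)^(2k) / (k!)^2."""
    s, term = 0.0, 1.0
    for k in range(terms):
        s += term
        term *= -((x / 2) ** 2) / ((k + 1) ** 2)
    return s

def clarke_temporal_acf(tau, f_max):
    """Temporal ACF of the isotropic-scattering (Clarke) reference model:
    r(tau) = J0(2*pi*f_max*tau)."""
    return bessel_j0(2 * math.pi * f_max * tau)

r0 = clarke_temporal_acf(0.0, 100.0)    # 1.0 at zero lag
r5 = clarke_temporal_acf(5e-3, 100.0)   # J0(pi), about -0.304
```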

Keywords: Computer Networks and Communications; scattering; autocorrelation; mathematical analysis; MIMO; correlation function; channel capacity; angle of arrival; Electrical and Electronic Engineering; wideband; reference model; Information Systems
Source: 2006 3rd International Symposium on Wireless Communication Systems

Thermodynamics of the Classical Planar Ferromagnet Close to the Zero-Temperature Critical Point: A Many-Body Approach

2012

We explore the low-temperature thermodynamic properties and crossovers of a $d$-dimensional classical planar Heisenberg ferromagnet in a longitudinal magnetic field close to its field-induced zero-temperature critical point by employing the two-time Green’s function formalism in classical statistical mechanics. By means of a classical Callen-like method for the magnetization and the Tyablikov-like decoupling procedure, we obtain, for any $d$, a low-temperature critical scenario which is quite similar to the one found for the quantum counterpart. Remarkably, for $d>2$ the discrimination between the two cases is found to be related to the different values of the shift exponent which governs the beha…

Keywords: physics; condensed matter physics; thermodynamics; statistical mechanics; reduced properties; critical point (thermodynamics); critical line; critical exponent; quantum; phase diagram
Source: Advances in Condensed Matter Physics

Estimation of confidence limits for descriptive indexes derived from autoregressive analysis of time series: Methods and application to heart rate va…

2017

The growing interest in personalized medicine requires making inferences from descriptive indexes estimated from individual recordings of physiological signals, with statistical analyses focused on individual differences between/within subjects, rather than comparing supposedly homogeneous cohorts. To this end, methods to compute confidence limits of individual estimates of descriptive indexes are needed. This study introduces numerical methods to compute such confidence limits and perform statistical comparisons between indexes derived from autoregressive (AR) modeling of individual time series. Analytical approaches are generally not viable, because the indexes are usually nonlinear funct…
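The resampling idea can be sketched as follows, with the AR(1) coefficient standing in for the descriptive index (the study treats more general nonlinear functions of AR parameters, e.g. spectral indexes of heart rate variability): fit the model, resimulate it many times with innovations matched to the residual spread, re-estimate on each surrogate, and take percentiles of the re-estimates. Sample sizes, replication count, and the simulated series are illustrative assumptions.

```python
import random, statistics

def ar1_coef(x):
    """Least-squares estimate of a in the AR(1) model x[t] = a*x[t-1] + e[t]."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(v * v for v in x[:-1])
    return num / den

def mc_confidence(x, n_rep=200, alpha=0.05, seed=1):
    """Monte Carlo confidence limits for the AR(1) coefficient:
    resimulate the fitted model n_rep times and take percentiles
    of the re-estimated coefficients."""
    a = ar1_coef(x)
    resid = [x[t] - a * x[t - 1] for t in range(1, len(x))]
    sd = statistics.pstdev(resid)
    rng = random.Random(seed)
    est = []
    for _ in range(n_rep):
        y = [x[0]]
        for _ in range(len(x) - 1):
            y.append(a * y[-1] + rng.gauss(0.0, sd))
        est.append(ar1_coef(y))
    est.sort()
    return a, est[int(alpha / 2 * n_rep)], est[int((1 - alpha / 2) * n_rep) - 1]

# Simulated individual recording: AR(1) with true coefficient 0.6
rng = random.Random(0)
x = [0.0]
for _ in range(499):
    x.append(0.6 * x[-1] + rng.gauss(0.0, 1.0))
a_hat, lo, hi = mc_confidence(x)
```

Because the index's sampling distribution is obtained numerically, the same recipe works for indexes with no tractable analytical distribution, which is the point the abstract makes.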

Keywords: statistical methods; entropy; cardiology; confidence intervals; regression analysis; heart rate; Monte Carlo method; probability theory; nonlinear dynamics; white noise; signal processing; mathematical and statistical techniques; statistical distributions; Settore ING-INF/06 (Electronic and Information Bioengineering)
Source: PLoS ONE

An Estimative Model of Automated Valuation Method in Italy

2017

The Automated Valuation Method (AVM) is a computer software program that analyzes data using an automated process. It is related to the process of appraising a universe of real estate properties, using common data and standard appraisal methodologies. Generally, the AVM is based on quantitative models (statistical, mathematical, econometric, etc.) for the valuation of properties gathered in homogeneous groups (by use and location), for which samples of market data are collected. The real estate data are collected regularly and systematically. Within the AVM, the proposed valuation scheme is a uniequational model that values properties in terms of widespread availability of sample …
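A minimal single-equation sketch of the idea, with hypothetical figures for one homogeneous market segment (the paper's actual model and variables are not reproduced here): price regressed on floor area by ordinary least squares, then used to appraise a new property.

```python
def fit_appraisal(areas, prices):
    """OLS fit of price = a + b * area, a minimal uniequational valuation model."""
    n = len(areas)
    mx = sum(areas) / n
    my = sum(prices) / n
    b = sum((x - mx) * (y - my) for x, y in zip(areas, prices)) / \
        sum((x - mx) ** 2 for x in areas)
    a = my - b * mx
    return a, b

# Hypothetical market sample for one segment (floor area in m^2, price in EUR)
areas  = [50, 65, 80, 95, 110, 120]
prices = [105000, 133000, 161000, 190000, 218000, 238000]
a, b = fit_appraisal(areas, prices)
estimate = a + b * 100   # appraised value of a 100 m^2 property in this segment
```

In a real AVM the sample is segmented by use and location first, and the equation typically includes several characteristics rather than area alone.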

Keywords: computer science; urban & regional planning; building & construction; real estate; statistical model; market segmentation; AVM; valuation; market segment; appraisal function; linear form; computer software; market data; econometrics; operations management; valuation (finance); Settore ICAR/22 (Appraisal)

Hierarchical modeling for rare event detection and cell subset alignment across flow cytometry samples.

2013

Flow cytometry is the prototypical assay for multi-parameter single cell analysis, and is essential in vaccine and biomarker research for the enumeration of antigen-specific lymphocytes that are often found at extremely low frequencies (0.1% or less). Standard analysis of flow cytometry data relies on visual identification of cell subsets by experts, a process that is subjective and often difficult to reproduce. An alternative and more objective approach is the use of statistical models to identify cell subsets of interest in an automated fashion. Two specific challenges for automated analysis are to detect extremely low frequency event subsets without biasing the estimate by pre-processing…
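The model-based idea can be sketched with a two-component 1-D Gaussian mixture fitted by EM, a toy stand-in for the hierarchical multivariate mixtures used on real cytometry data: the rare subset appears as a low-weight component rather than being gated by eye. The initialization heuristic (seeding the second component at the data maximum with a small prior weight) and the simulated data are assumptions of this sketch.

```python
import math, random, statistics

def em_gmm2(data, n_iter=40):
    """EM for a two-component 1-D Gaussian mixture; returns (weights, means, sds).
    Component 1 is seeded at the data maximum with a small prior weight so it
    can latch onto a rare high-signal subset."""
    mu = [statistics.median(data), max(data)]
    sd = [1.0, 1.0]
    w = [0.99, 0.01]
    n = len(data)
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each event
        resp = []
        for x in data:
            p = [w[k] / sd[k] * math.exp(-0.5 * ((x - mu[k]) / sd[k]) ** 2)
                 for k in (0, 1)]
            resp.append(p[1] / (p[0] + p[1]))
        # M-step: update weights, means, and standard deviations
        n1 = sum(resp)
        n0 = n - n1
        w = [n0 / n, n1 / n]
        mu = [sum((1 - r) * x for r, x in zip(resp, data)) / n0,
              sum(r * x for r, x in zip(resp, data)) / n1]
        sd = [max(0.05, math.sqrt(sum((1 - r) * (x - mu[0]) ** 2
                                      for r, x in zip(resp, data)) / n0)),
              max(0.05, math.sqrt(sum(r * (x - mu[1]) ** 2
                                      for r, x in zip(resp, data)) / n1))]
    return w, mu, sd

rng = random.Random(0)
# 990 "bulk" events near 0 plus 10 rare events near 5 (~1% frequency)
data = [rng.gauss(0.0, 1.0) for _ in range(990)] + \
       [rng.gauss(5.0, 0.3) for _ in range(10)]
w, mu, sd = em_gmm2(data)
```

The estimated weight of the second component directly yields the subset frequency, which is what makes the mixture formulation attractive for rare-event enumeration.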

Keywords: flow cytometry; adaptive immunity; single-cell analysis; T cells; statistics; data models; immunotherapy; immunology; biostatistics; statistical model; pattern recognition; artificial intelligence; data mining; lymphocyte subsets; immunologic techniques; clinical immunology; statistical methods; immunoassays; molecular biology
Source: PLoS Computational Biology

Geometrical Modeling of Non-Stationary Polarimetric Vehicular Radio Channels

2019

This paper presents a geometry-based statistical model (GBSM) of polarimetric wideband multipath radio channels for vehicle-to-vehicle (V2V) communications. The proposed model captures the effects of depolarization caused by multipath propagation, and it also accounts for the non-stationary characteristics of wideband V2V channels. This is a novel feature, because the existing polarimetric channel models are built on the assumption that the channel is a wide-sense stationary random process. In the modeling framework described in this paper, the channel depolarization function is given by a linear transformation in the form of a simple rotation matrix. This linear transformation is transpare…
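A depolarization step of this form can be sketched directly: a 2×2 rotation applied to the (vertical, horizontal) field pair leaks power from each polarization into the other while preserving the total. The 30° angle below is an arbitrary illustration, not a parameter derived from the paper's channel geometry.

```python
import math

def depolarize(e_v, e_h, theta):
    """Apply a 2x2 rotation-matrix depolarization to the (vertical, horizontal)
    field pair: each polarization leaks into the other; total power unchanged."""
    c, s = math.cos(theta), math.sin(theta)
    return c * e_v - s * e_h, s * e_v + c * e_h

# A purely vertically polarized component rotated by an illustrative 30 degrees
v, h = depolarize(1.0, 0.0, math.radians(30))
```

Because a rotation is orthogonal, |v|² + |h|² is invariant, which is what makes this the simplest power-conserving depolarization model.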

Keywords: computer science; computation; polarimetry; statistical model; rotation matrix; topology; linear map; wideband; multipath propagation; communication channel

Methodological considerations for interrupted time series analysis in radiation epidemiology: an overview

2021

Interrupted time series analysis (ITSA) is a method that can be applied to evaluate health outcomes in populations exposed to ionizing radiation following major radiological events. Using aggregated time series data, ITSA evaluates whether the time trend of a health indicator shows a change associated with the radiological event. That is, ITSA checks whether there is a statistically significant discrepancy between the projection of a pre-event trend and the data empirically observed after the event. Conducting ITSA requires one to consider specific methodological issues due to unique threats to internal validity that make ITSA prone to bias. Here we discuss the strengths and limitations of …
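The core ITSA regression can be sketched as a segmented model with a level-change and a slope-change term after the event. The data below are noise-free and simulated; real analyses must additionally handle autocorrelation, confounding, and the other validity threats the overview discusses.

```python
def ols(X, y):
    """Ordinary least squares via normal equations and Gaussian elimination."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                       # forward elimination with pivoting
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    beta = [0.0] * k
    for i in range(k - 1, -1, -1):           # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

# Simulated monthly health indicator: baseline trend, then a level drop of 5
# and a slope change of +0.3 after the event at t0 = 24
t0 = 24
y = [10 + 0.2 * t + (-5 + 0.3 * (t - t0) if t >= t0 else 0) for t in range(48)]
X = [[1.0, float(t), 1.0 if t >= t0 else 0.0, float(t - t0) if t >= t0 else 0.0]
     for t in range(48)]
b0, b1, level_change, slope_change = ols(X, y)
```

Inference on `level_change` and `slope_change` is what "checking the discrepancy between the projected pre-event trend and the post-event data" amounts to in regression form.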

Keywords: computer science; confounding; public health; interrupted time series analysis; statistical model; health indicator; research design; data quality; econometrics; internal validity; time series; spurious relationship; forecasting
Source: Journal of Radiological Protection

Entropy-Based Classifier Enhancement to Handle Imbalanced Class Problem

2017

The paper presents a possible enhancement of entropy-based classifiers to handle problems caused by class imbalance in the original dataset. The proposed method was tested on synthetic data in order to analyse its robustness in a controlled environment with different class proportions. The method was also tested on real medical data with imbalanced classes and compared to the results of the original classification algorithm. The medical field was chosen for testing due to frequent situations with uneven class ratios.
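One generic way entropy-based splitting can be adapted to imbalance (not necessarily the paper's specific enhancement) is to re-weight class counts before computing entropy, so a rare class is not drowned out by the majority class. The counts and inverse-frequency weights below are illustrative.

```python
import math

def entropy(counts):
    """Shannon entropy (bits) of a class distribution given per-class counts."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

# Imbalanced node: 95 majority vs 5 minority samples
counts = [95, 5]
weights = [1 / c for c in counts]          # inverse-frequency class weights
plain = entropy(counts)                    # ~0.286 bits: node looks almost pure
balanced = entropy([c * w for c, w in zip(counts, weights)])  # 1.0: maximally mixed
```

An unweighted split criterion sees the imbalanced node as nearly pure and has little incentive to isolate the minority class; the re-weighted criterion does not.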

Keywords: computer science; entropy; decision tree; pattern recognition; synthetic data; data mining; artificial intelligence
Source: Procedia Computer Science

Are nonlinear model-free conditional entropy approaches for the assessment of cardiac control complexity superior to the linear model-based one?

2016

Objective: We test the hypothesis that the linear model-based (MB) approach for the estimation of conditional entropy (CE) can be utilized to assess the complexity of the cardiac control in healthy individuals. Methods: An MB estimate of CE was tested in an experimental protocol (i.e., the graded head-up tilt) known to produce a gradual decrease of cardiac control complexity as a result of the progressive vagal withdrawal and concomitant sympathetic activation. The MB approach was compared with traditionally exploited nonlinear model-free (MF) techniques such as corrected approximate entropy, sample entropy, corrected CE, two k-nearest-neighbor CE procedures and permutation CE. Electroca…
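One of the MF baselines, sample entropy, can be sketched as follows: it is the negative log of the conditional probability that subsequences matching for m points (within tolerance r) also match at the next point, so irregular signals score higher. The test signals, m = 2, and r = 0.2 are illustrative assumptions, not the paper's protocol settings.

```python
import math, random

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln of the conditional probability that subsequences
    matching for m points (within tolerance r, Chebyshev distance) also
    match for m+1 points."""
    def count_matches(mm):
        n = len(x) - mm + 1
        tpl = [x[i:i + mm] for i in range(n)]
        return sum(1 for i in range(n) for j in range(i + 1, n)
                   if max(abs(p - q) for p, q in zip(tpl[i], tpl[j])) <= r)
    b_m = count_matches(m)       # template pairs matching for m points
    a_m = count_matches(m + 1)   # ... that still match at m+1 points
    return -math.log(a_m / b_m)

rng = random.Random(42)
regular = [math.sin(0.5 * i) for i in range(200)]    # deterministic, regular
noisy = [rng.gauss(0.0, 1.0) for _ in range(200)]    # white noise, irregular
se_regular = sample_entropy(regular)
se_noisy = sample_entropy(noisy)
```

In practice r is usually scaled to a fraction of the signal's standard deviation, and short recordings make the m+1 match count (and thus the estimate) unstable, which is one motivation for the MB alternative studied here.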

Keywords: computer science; entropy; biomedical engineering; sensitivity and specificity; approximate entropy; sample entropy; conditional entropy; mutual information; heart rate variability; head-up tilt; linear models; cardiovascular regulation; autonomic nervous system; computer simulation; reproducibility of results; statistical model; nonlinear dynamics; random variable; algorithms; Settore ING-INF/06 (Electronic and Information Bioengineering)